
A Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction



Abstract

The nonlinear autoregressive exogenous (NARX) model, which predicts the current value of a time series based upon its previous values as well as the current and past values of multiple driving (exogenous) series, has been studied for decades. Despite the fact that various NARX models have been developed, few of them can capture the long-term temporal dependencies appropriately and select the relevant driving series to make predictions. In this paper, we propose a dual-stage attention-based recurrent neural network (DA-RNN) to address these two issues. In the first stage, we introduce an input attention mechanism to adaptively extract relevant driving series (a.k.a., input features) at each time step by referring to the previous encoder hidden state. In the second stage, we use a temporal attention mechanism to select relevant encoder hidden states across all time steps. With this dual-stage attention scheme, our model can not only make predictions effectively, but can also be easily interpreted. Thorough empirical studies based upon the SML 2010 dataset and the NASDAQ 100 Stock dataset demonstrate that the DA-RNN can outperform state-of-the-art methods for time series prediction.
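To make the two attention stages described in the abstract concrete, the following is a minimal PyTorch sketch of the idea. It is not the authors' reference implementation: the class names (InputAttentionEncoder, TemporalAttentionDecoder), the scoring networks, the layer sizes, and the example tensor shapes (81 driving series, a window of 10 steps) are illustrative assumptions, and training code is omitted.

# Minimal sketch of the dual-stage attention idea, assuming PyTorch.
# Class names, scoring networks, and dimensions are illustrative assumptions.
import torch
import torch.nn as nn


class InputAttentionEncoder(nn.Module):
    """Stage 1: weight the n driving series at each time step using the
    previous encoder hidden/cell state, then encode with an LSTM cell."""

    def __init__(self, n_series: int, window: int, hidden: int):
        super().__init__()
        self.hidden = hidden
        self.lstm = nn.LSTMCell(n_series, hidden)
        # score each driving series k from [h_{t-1}; s_{t-1}; x^k over the window]
        self.attn = nn.Sequential(
            nn.Linear(2 * hidden + window, window), nn.Tanh(), nn.Linear(window, 1)
        )

    def forward(self, x):                       # x: (batch, window, n_series)
        batch, window, n = x.shape
        h = x.new_zeros(batch, self.hidden)
        s = x.new_zeros(batch, self.hidden)
        encoded = []
        for t in range(window):
            hs = torch.cat([h, s], dim=1).unsqueeze(1).expand(batch, n, -1)
            series = x.permute(0, 2, 1)         # (batch, n_series, window)
            e = self.attn(torch.cat([hs, series], dim=2)).squeeze(-1)
            alpha = torch.softmax(e, dim=1)     # input attention weights
            h, s = self.lstm(alpha * x[:, t, :], (h, s))
            encoded.append(h)
        return torch.stack(encoded, dim=1)      # (batch, window, hidden)


class TemporalAttentionDecoder(nn.Module):
    """Stage 2: attend over all encoder hidden states with the previous
    decoder state, then predict the next target value."""

    def __init__(self, enc_hidden: int, dec_hidden: int):
        super().__init__()
        self.dec_hidden = dec_hidden
        self.attn = nn.Sequential(
            nn.Linear(2 * dec_hidden + enc_hidden, enc_hidden),
            nn.Tanh(),
            nn.Linear(enc_hidden, 1),
        )
        self.lstm = nn.LSTMCell(1, dec_hidden)
        self.tilde = nn.Linear(enc_hidden + 1, 1)
        self.out = nn.Linear(dec_hidden + enc_hidden, 1)

    def forward(self, enc, y_hist):             # enc: (batch, window, enc_hidden)
        batch, window, _ = enc.shape            # y_hist: (batch, window - 1)
        d = enc.new_zeros(batch, self.dec_hidden)
        s = enc.new_zeros(batch, self.dec_hidden)
        context = enc.new_zeros(batch, enc.shape[2])
        for t in range(window - 1):
            ds = torch.cat([d, s], dim=1).unsqueeze(1).expand(-1, window, -1)
            beta = torch.softmax(self.attn(torch.cat([ds, enc], 2)).squeeze(-1), 1)
            context = torch.bmm(beta.unsqueeze(1), enc).squeeze(1)
            y_tilde = self.tilde(torch.cat([context, y_hist[:, t:t + 1]], dim=1))
            d, s = self.lstm(y_tilde, (d, s))
        return self.out(torch.cat([d, context], dim=1))  # (batch, 1)


if __name__ == "__main__":
    x = torch.randn(4, 10, 81)                  # 81 driving series, window T = 10
    y = torch.randn(4, 9)                       # past target values y_1 .. y_{T-1}
    enc = InputAttentionEncoder(n_series=81, window=10, hidden=64)
    dec = TemporalAttentionDecoder(enc_hidden=64, dec_hidden=64)
    print(dec(enc(x), y).shape)                 # torch.Size([4, 1])

In this sketch the encoder reweights the driving series before every LSTM step (input attention), while the decoder forms a context vector over all encoder hidden states before updating its own state and emitting the one-step-ahead prediction (temporal attention).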
